Estimating the variance of Shannon entropy
Authors
Abstract
The statistical analysis of data stemming from dynamical systems, including, but not limited to, time series, routinely relies on the estimation of information-theoretic quantities, most notably Shannon entropy. To this purpose, a widespread tool is provided by the so-called plug-in estimator, whose properties in terms of bias and variance have been investigated since the first decade after the publication of Shannon's seminal works. In the case of an underlying multinomial distribution, while the bias can be evaluated by knowing the support set size, the variance is far more elusive. The aim of the present work is to investigate, in that case, the variance of the plug-in estimator, a parameter that describes the uncertainty of entropy assessments. We then exactly determine the probability distributions that maximize this parameter. The results presented here allow one to set upper limits on the uncertainty of entropy assessments under the hypothesis of memoryless stochastic processes.
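As a concrete illustration of the quantities discussed above, the following sketch (in Python, assuming NumPy; the function names are ours) computes the plug-in entropy estimate from multinomial counts together with the classical first-order, delta-method approximation of its variance. This is the textbook approximation, not the exact characterization derived in the paper.

import numpy as np

def plugin_entropy(counts):
    # Plug-in (maximum-likelihood) estimate of Shannon entropy, in nats.
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    return -np.sum(p * np.log(p))

def plugin_entropy_variance(counts):
    # First-order approximation under a multinomial model:
    # Var ~ (1/N) * [ sum p (ln p)^2 - (sum p ln p)^2 ]
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    return (np.sum(p * np.log(p) ** 2) - np.sum(p * np.log(p)) ** 2) / n

# Example: a memoryless source observed 1000 times
rng = np.random.default_rng(0)
true_p = np.array([0.5, 0.25, 0.125, 0.125])
sample = rng.multinomial(1000, true_p)
H = plugin_entropy(sample)
se = np.sqrt(plugin_entropy_variance(sample))
print(f"H_plugin = {H:.4f} nats, approx. std. error = {se:.4f}")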
Similar resources
On the Estimation of Shannon Entropy
Shannon entropy is increasingly used in many applications. In this article, an estimator of the entropy of a continuous random variable is proposed. Consistency and scale invariance of the variance and mean squared error of the proposed estimator are proved, and comparisons are then made with the entropy estimators of Vasicek (1976), van Es (1992), Ebrahimi et al. (1994), and Correa (1995). A simulation st...
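For reference, here is a minimal sketch of Vasicek's (1976) spacing estimator mentioned above (Python with NumPy; the window heuristic for m is a common choice, not prescribed by the article).

import numpy as np

def vasicek_entropy(x, m=None):
    # Vasicek (1976) spacing estimator of differential entropy, in nats.
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(round(np.sqrt(n) / 2)))  # common heuristic window
    # Order statistics are clamped at the sample boundaries.
    upper = x[np.minimum(np.arange(n) + m, n - 1)]
    lower = x[np.maximum(np.arange(n) - m, 0)]
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))

# Standard normal sample: true differential entropy = 0.5*ln(2*pi*e) ~ 1.4189 nats
rng = np.random.default_rng(1)
print(vasicek_entropy(rng.standard_normal(5000)))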
Estimating the Variance in Before-After Studies
Problem: To simplify the computation of the variance in before-after studies, it is generally assumed that the observed crash data for each entity (or observation) are Poisson distributed. Given the characteristics of this distribution, the observed value (x_i) for each entity is implicitly made equal to its variance. However, the variance should be estimated using the conditional properties ...
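A small illustration of the Poisson assumption questioned above: treating each observed count as Poisson makes its variance estimate equal to the count itself, so the standard error of a before-after difference reduces to the square root of the summed counts. This is a sketch of the simplification, not of the study's proposed conditional estimator.

import math

def poisson_diff_se(before, after):
    # If each count is treated as Poisson, Var(x) = x, so
    # Var(after - before) = before + after for independent counts.
    return math.sqrt(before + after)

print(poisson_diff_se(120, 95))  # ~14.66 crashes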
Rényi Extrapolation of Shannon Entropy
Relations between Shannon entropy and Rényi entropies of integer order are discussed. For any N-point discrete probability distribution for which the Rényi entropies of order two and three are known, we provide a lower and an upper bound for the Shannon entropy. The average of both bounds provides an explicit extrapolation for this quantity. These results imply relations between the von Neumann...
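The exact bounds and the extrapolation formula are given in the cited work and are not reproduced here; the sketch below merely computes the Shannon and Rényi entropies involved and illustrates that the Rényi entropy is non-increasing in its order (Python with NumPy; names are ours).

import numpy as np

def renyi_entropy(p, alpha):
    # Rényi entropy of order alpha, in nats; alpha = 1 recovers Shannon as a limit.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = np.array([0.4, 0.3, 0.2, 0.1])
H1, H2, H3 = (renyi_entropy(p, a) for a in (1, 2, 3))
print(H1, H2, H3)  # H1 >= H2 >= H3: Rényi entropy is non-increasing in the order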
Shannon Entropy Analysis of the Genome Code
This paper studies the chromosome information of twenty-five species, namely, mammals, fishes, birds, insects, nematodes, fungus, and one plant. A quantifying scheme inspired by the state-space representation of dynamical systems is formulated. Based on this algorithm, the information of each chromosome is converted into a bidimensional distribution. The plots are then analyzed and characterize...
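The paper's bidimensional state-space scheme is not reproduced here; as a generic illustration of the underlying quantity, the following sketch computes the Shannon entropy of a symbol sequence's empirical distribution (Python; the toy nucleotide string is arbitrary).

from collections import Counter
import math

def sequence_entropy(seq):
    # Shannon entropy (bits per symbol) of the empirical symbol distribution.
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

print(sequence_entropy("ACGTACGGATCCGTTAGC"))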
Shannon entropy as a new measure of aromaticity, Shannon aromaticity.
Based on the local Shannon entropy concept in information theory, a new measure of aromaticity is introduced. This index, which describes the probability of electronic charge distribution between atoms in a given ring, is called Shannon aromaticity (SA). Using the B3LYP method and different basis sets (6-31G**, 6-31+G** and 6-311++G**), the SA values of some five-membered heterocycles, C(4)H(4)X, a...
Journal
Journal title: Physical Review E
Year: 2021
ISSN: 1550-2376, 1539-3755
DOI: https://doi.org/10.1103/physreve.104.024220